The long-term prognostic significance of 6-minute walk test distance in patients with chronic heart failure
Background. The 6-minute walk test (6-MWT) is used to assess patients with chronic heart failure (CHF). The prognostic significance of the 6-MWT distance during long-term follow-up (>5 years) is unclear. Methods. 1,667 patients (median [interquartile range, IQR] age 72 [65-77]; 75% male) with heart failure due to left ventricular systolic impairment undertook a 6-MWT as part of their baseline assessment and were followed up for 5 years. Results. At 5 years' follow-up, those patients who died (n = 959) were older at baseline and had a higher log NT pro-BNP than those who survived to 5 years (n = 708). 6-MWT distance was lower in those who died [163 (153) m versus 269 (160) m; P < 0.0001] […] >360 m. 6-MWT distance was a predictor of all-cause mortality (HR 0.97; 95% CI 0.96-0.97; Chi-square = 184.1; P < 0.0001). Independent predictors of all-cause mortality were decreasing 6-MWT distance, increasing age, increasing NYHA classification, increasing log NT pro-BNP, decreasing diastolic blood pressure, decreasing sodium, and increasing urea. Conclusion. The 6-MWT is an important independent predictor of all-cause mortality over long-term follow-up in patients with CHF. © 2014 Lee Ingle et al.
Some aspects of organonitrogen transition metal complexes
Attempts to introduce amidino ligands into transition metal carbonyl and non-carbonyl systems are described, and a new synthetic route to transition metal methyleneamino complexes is explored. Reaction of lithiodiaryl-acetamidines and -benzamidines with Re(CO)(_5)X (X = Cl, Br) produced carbamoyl-type complexes, Re(CO)(_4)(CONR-CR'-NR), containing a bidentate carbamoyl-amidino ligand. These complexes could be decarbonylated by heating, forming the chelated amidino complexes Re(CO)(_4)(RN-CR'-NR). The chelated amidino complexes were also prepared by the direct action of lithioamidines on the rhenium carbonyl halide dimer [Re(CO)(_4)X](_2), and by the reaction of n-butyllithium with the simple two-electron donor monodentate amidine complexes, Re(CO)(_4)(Amidine)X. Complexes containing ortho-metallated diaryl-formamidino, -acetamidino and -benzamidino ligands were prepared by the action of the parent amidine on Re(CO)(_5)X or Re(CO)(_4)(Amidine)X in refluxing monoglyme. These complexes also contained a simple two-electron donor monodentate amidine. The acetamidine and formamidine derivatives formed six-membered ortho-metallated rings, the benzamidines forming either five- or six-membered ring complexes. In contrast, refluxing Re(CO)(_4)(Amidine)X with PPh(_3) in monoglyme produced a chelated amidino species, Re(CO)(_3)(PPh(_3))(RN-CR'-NR). The reaction mechanisms for the formation of the above complexes, and their probable bonding modes, are discussed. Interconversions between many of the species were possible. FeCl(_3) reacts with lithiodiarylamidines to produce [Fe(amidino)(_3)](_n)-type complexes. The acetamidines and formamidines form monomeric complexes (with a probable tris-chelate structure), and the benzamidine forms both a monomeric and an oligomeric/polymeric complex. FeCl(_2) reacts similarly with lithio-di-p-tolylacetamidine, both monomeric and oligomeric/polymeric [Fe(amidine)(_2)](_n) species being produced.
The complexes are paramagnetic; the Fe(III) species have magnetic moments that suggest they contain five unpaired electrons. Nucleophilic attack by n-butyllithium at the carbon atom of [CpMo(CO)(AsPh(_3))(NCPh)](^+) did not produce a methyleneamino complex. Free nitrile was liberated in the reaction, and dimeric molybdenum complexes were produced.
Natural anti-realism
The thesis defines and examines a position ('natural anti-realism') which combines an anti-realist semantics with an evolutionary epistemology. An anti-realist semantics, by requiring that a theory of meaning be also a theory of understanding, cries out for an explicit epistemological component. In urging an evolutionary epistemology as such a component, I seek to preserve and underscore the semantic insights of the anti-realist whilst deflecting the common criticism that the anti-realist must perforce embrace some form of noxious idealism.
An evolutionary epistemology, I argue, can provide a distinctive content for the belief that reality is independent of human thought without needing to claim that anything we can say or think about the world can be conceived as being true or false in full independence of our capacity to know it as such. This content is to be secured in two ways. The first is to observe that language is best understood as a tool of minds which are themselves best understood as the products of a natural process operating in an independently real world. The second is to form a non-transcendent conception of transcendent facts. The accessible evidence concerning the form of the selective process, it is argued, warrants the claim that reality may exceed its humanly accessible contours. For it warrants the claim that man is probably cognitively limited and biased in ways rooted in our peculiar, and somewhat contingent, evolutionary past. The natural anti-realist thus conceives of reality as both independent of, and potentially transcending the limits of, man's particular mental orientation. A largely realistic metaphysics may thus accompany an anti-realist semantics without the lapse into vacuity or incoherence which some commentators seem to fear.
An investigation into the feasibility, problems and benefits of re-engineering a legacy procedural CFD code into an event driven, object oriented system that allows dynamic user interaction
This research started with questions about how the overall efficiency, reliability and ease of use of Computational Fluid Dynamics (CFD) codes could be improved using available software engineering and Human-Computer Interaction (HCI) techniques. Much of this research was driven by the difficulties experienced by novice CFD users in the area of Fire Field Modelling, where the introduction of performance-based building regulations has led to a situation in which non-CFD experts increasingly make use of CFD techniques, with varying degrees of effectiveness, for safety-critical research. Such modelling has not been helped by the mode of use, the high degree of expertise required from the user, and the complexity of specifying a simulation case. Many of the early stages of this research were channelled by perceived limitations of the original legacy CFD software chosen as a framework for these investigations. These limitations included poor code clarity, poor overall efficiency due to the use of batch-mode processing, weak assurance that the final results presented by the CFD code were correct, and the requirement for considerable expertise on the part of users.
The innovative incremental re-engineering techniques developed to reverse-engineer, re-engineer and improve the internal structure and usability of the software arose as a by-product of the research into overcoming the problems discovered in the legacy software. The incremental re-engineering methodology was considered important enough to warrant inclusion in this thesis. Various HCI techniques were employed to overcome the efficiency and solution-correctness problems. These investigations demonstrated that the quality, reliability and overall run-time efficiency of CFD software can be significantly improved by the introduction of run-time monitoring and interactive solution control. It should be noted that the re-engineered CFD code runs more slowly than the original FORTRAN legacy code, due mostly to the changes in the calling architecture of the software and differences in compiler optimisation; but it is argued that the overall effectiveness, reliability and ease of use of the prototype software are all greatly improved. Investigations into dynamic solution control (made possible by the open software architecture and the interactive control interface) demonstrated considerable savings when using solution-control optimisation. Such investigations also demonstrated the potential for improved assurance of correct simulation when compared with the batch mode of processing found in most legacy CFD software. Investigations were also conducted into the efficiency implications of using unstructured group solvers.
These group solvers are a derivation of the simple point-by-point Jacobi Over-Relaxation (JOR) and Successive Over-Relaxation (SOR) solvers [CROFT98]; using group solvers allows the computational processing to be targeted more effectively on regions or logical collections of cells that require more intensive computation. Considerable savings were demonstrated for both static and dynamic group membership when using these group solvers for a complex 3-dimensional fire modelling scenario. Furthermore, the improvements in the system architecture (brought about as a result of software re-engineering) have helped to create an open framework that is both easy to comprehend and easy to extend. This is in spite of the underlying unstructured nature of the simulation mesh, with all of the associated complexity that this brings to the data structures. The prototype CFD software framework has recently been used as the core processing module in a commercial Fire Field Modelling product (called "SMARTFIRE" [EWER99-1]). This CFD framework is also being used by researchers to investigate many diverse aspects of CFD technology, including Knowledge-Based Solution Control, Gaseous and Solid Phase Combustion, Adaptive Meshing, and CAD file interpretation for ease of case specification.
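The group-solver idea can be illustrated with a minimal sketch (hypothetical code, not the SMARTFIRE implementation): a point-by-point SOR solver whose sweeps are restricted to groups of cells whose residual is still large, so that computation is targeted where it is needed. The 1D Poisson problem and group layout below are illustrative assumptions only.

```python
# Minimal sketch of a group-based SOR solver (illustrative only).
# Cells are partitioned into groups, and SOR sweeps are applied only
# to groups whose residual still exceeds the tolerance.

def sor_sweep(u, f, cells, omega=1.5, h=1.0):
    """One SOR sweep over the listed interior cells of a 1D Poisson
    problem -u'' = f with fixed boundary values u[0] and u[-1]."""
    for i in cells:
        gauss_seidel = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])
        u[i] += omega * (gauss_seidel - u[i])
    return u

def group_residual(u, f, cells, h=1.0):
    """Max residual |A u - f| over the cells in one group."""
    return max(abs((2 * u[i] - u[i - 1] - u[i + 1]) / (h * h) - f[i])
               for i in cells)

def solve_grouped(u, f, groups, tol=1e-8, max_iters=10_000):
    """Iterate, sweeping only the groups that are not yet converged."""
    for _ in range(max_iters):
        active = [g for g in groups if group_residual(u, f, g) > tol]
        if not active:
            break
        for g in active:
            sor_sweep(u, f, g)
    return u
```

Static group membership would fix `groups` up front; dynamic membership, as investigated in the thesis, would re-partition the cells between sweeps as the solution evolves.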
The Optimisation of Stochastic Grammars to Enable Cost-Effective Probabilistic Structural Testing
The effectiveness of probabilistic structural testing depends on the characteristics of the probability distribution from which test inputs are sampled at random. Metaheuristic search has been shown to be a practical method of optimising the characteristics of such distributions. However, the applicability of the existing search-based algorithm is limited by the requirement that the software's inputs must be a fixed number of numeric values. In this paper we relax this limitation by means of a new representation for the probability distribution. The representation is based on stochastic context-free grammars but incorporates two novel extensions: conditional production weights and the aggregation of terminal symbols representing numeric values. We demonstrate that an algorithm which combines the new representation with hill-climbing search is able to efficiently derive probability distributions suitable for testing software with structurally complex input domains.
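As a rough illustration of the underlying representation, the following is a minimal sketch of sampling test inputs from a stochastic context-free grammar with weighted productions. The grammar and weights are hypothetical, and the paper's two extensions (conditional production weights and numeric-terminal aggregation) are not modelled here.

```python
import random

# Hypothetical weighted grammar for comma-separated digit lists.
# Each nonterminal maps to a list of (production body, weight) pairs.
GRAMMAR = {
    "LIST": [(["NUM"], 1.0), (["NUM", ",", "LIST"], 2.0)],
    "NUM":  [(["0"], 1.0), (["1"], 1.0), (["2"], 1.0)],
}

def sample(symbol, rng=random):
    """Expand a symbol by choosing productions in proportion to weight."""
    if symbol not in GRAMMAR:          # terminal symbol: emit as-is
        return symbol
    productions = GRAMMAR[symbol]
    weights = [w for _, w in productions]
    body, _ = rng.choices(productions, weights=weights)[0]
    return "".join(sample(s, rng) for s in body)
```

A hill-climbing search over such a distribution would repeatedly perturb the production weights, sample a batch of inputs, and keep the perturbation whenever it improves some fitness measure of the resulting test-input distribution.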
Diuretic Treatment in Patients with Heart Failure: Current Evidence and Future Directions-Part II: Combination Therapy
Purpose of Review: Fluid retention, or congestion, is a major cause of symptoms, poor quality of life, and adverse outcomes in patients with heart failure (HF). Despite advances in disease-modifying therapy, the mainstay of treatment for congestion, loop diuretics, has remained largely unchanged for 50 years. In these two articles (part I: loop diuretics; part II: combination therapy), we review the history of diuretic treatment and the current trial evidence for different diuretic strategies, and explore potential future directions of research. Recent Findings: We assess recent trials, including DOSE, TRANSFORM, ADVOR, CLOROTIC, OSPREY-AHF, and PUSH-AHF, and consider how these may influence current practice and future research. Summary: There are few data on which to base diuretic therapy in clinical practice. The most robust evidence is for high-dose over low-dose loop diuretic treatment in patients admitted to hospital with HF, yet this is not reflected in guidelines. There is an urgent need for more and better research on different diuretic strategies in patients with HF.
A new methodology for learning design
This paper describes the development of a new methodology for learning design. Our approach is predicated on the view that no single, simple view of design is appropriate, because of the inherently messy and creative nature of design. Instead, we are adopting an interactive and multi-faceted approach which consists of a series of cycles of user consultation, focus groups and workshops alongside the development of learning design tools and resources. In particular, we will describe how we have adapted an existing mind-mapping and argumentation tool, Compendium, so that it can be used as a means of guiding designers through the learning design decision-making process in the creation of learning activities. We will describe the initial evaluations of the use of this tool, along with our findings to date from a series of fact-finding exercises to better understand individual and team approaches to design.
Microbiome Diversity and Differential Abundances Associated with BMI, Immune Markers, and Fecal Short Chain Fatty Acids Before and After Synbiotic Supplementation
The gut microbiota and its metabolites, namely short-chain fatty acids (SCFAs), interact with the digestive, immune, and nervous systems. Microbiota with disrupted composition are highly associated with obesity, gastrointestinal symptoms, and chronic inflammation. Levels of SCFAs in the feces can represent the dynamics of the microbiota, and they represent one mechanism by which the microbiota interacts with its host. This study aimed to further our understanding of the associations between microbiota bacterial diversity and SCFAs, immune markers, BMI, and GI symptoms, and to identify bacteria that are differentially abundant across BMI groups and with synbiotic supplementation. Data (SCFAs, immunoglobulins, body mass index, fecal fiber, fecal protein, measures of GI symptoms, and 16S rRNA sequences; n = 11) were extracted from a randomized controlled trial investigating the effects of synbiotic supplementation in non-celiac gluten-sensitive participants. QIIME2 was used to process the 16S rRNA data, to analyze quantitative, qualitative, phylogenetic quantitative, and phylogenetic qualitative measures of alpha and beta diversity, and to perform an analysis of composition of microbiomes (ANCOM) to identify differential abundances. Multiple metrics of alpha diversity were found to correlate significantly with IgG4, IgM, IL-2, acetate, propionate, isobutyrate, valerate, isovalerate, caproate, heartburn, urgent need to defecate, and feelings of incomplete evacuation. Multiple metrics of beta diversity differed significantly between the normal and overweight, normal and obese, and overweight and obese BMI classification groups. Beta diversity was also found to correlate significantly with IgG1, IgG3, IgG4, IgA, IL-6, IL-8, fecal fiber, propionate, butyrate, heartburn, acid regurgitation, nausea and vomiting, bloating, abdominal distension, increased gas, and eructation. The synbiotic intervention did not significantly alter alpha or beta diversity.
An ANCOM identified bacterial taxa differentially abundant with BMI shifts and with synbiotic supplementation, though these taxa were not those included in the synbiotic. The findings demonstrate alpha and beta diversity associations with various SCFAs, GI symptoms, immune markers, and BMI, and the results of the placebo-controlled intervention suggest careful consideration of placebo contents moving forward. This research supports plans to apply the analysis to larger sample sizes to elucidate changes in microbial profiles that are associated with clinically relevant biomarkers and symptoms.
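For readers unfamiliar with the diversity metrics referred to above, here is a minimal sketch (not the QIIME2 implementation) of one common alpha-diversity measure, Shannon diversity, computed from a hypothetical vector of per-taxon read counts.

```python
import math

def shannon_diversity(counts):
    """Shannon alpha diversity H' = -sum(p_i * ln p_i) over the
    relative abundances p_i of taxa with nonzero counts.
    (Illustrative sketch; QIIME2 computes this and many other metrics.)"""
    total = sum(counts)
    proportions = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in proportions)
```

Higher values indicate communities that are both richer (more taxa) and more even (similar abundances); a community dominated by a single taxon scores zero.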
Prevention or procrastination for heart failure?: Why we need a universal definition of heart failure
No abstract available